Automatic Early Stopping Using Cross Validation: Quantifying the Criteria
Appeared in Neural Networks, 1998
Author
Lutz Prechelt
Abstract
Cross validation can be used to detect when overfitting starts during supervised training of a neural network; training is then stopped before convergence to avoid the overfitting ('early stopping'). The exact criterion used for cross-validation-based early stopping, however, is chosen in an ad-hoc fashion by most researchers, or training is stopped interactively. To aid a more well-founded selection of the stopping criterion, 14 different automatic stopping criteria from 3 classes were evaluated empirically for their efficiency and effectiveness on 12 different classification and approximation tasks, using multilayer perceptrons with RPROP training. The experiments show that, on average, slower stopping criteria allow for small improvements in generalization (on the order of 4%), but cost about a factor of 4 in training time.
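The stopping criteria studied here compare the course of the validation-set error against the best value observed so far. As a rough illustration only, the Python sketch below implements a generalization-loss style criterion: stop once the current validation error exceeds the best one seen by more than alpha percent. The names train_step and validation_error, the default alpha of 5, and the epoch limit are illustrative assumptions, not the paper's exact setup.

```python
# Minimal sketch of a generalization-loss (GL) style early stopping criterion.
# `train_step` and `validation_error` are user-supplied callables (assumptions).

def train_with_early_stopping(train_step, validation_error, alpha=5.0, max_epochs=1000):
    """Stop once GL(t) = 100 * (E_va(t) / E_opt(t) - 1) exceeds alpha percent,
    where E_opt(t) is the lowest validation error seen up to epoch t."""
    best_val = float("inf")
    best_epoch = 0
    for epoch in range(1, max_epochs + 1):
        train_step()                 # one epoch of weight updates (e.g. RPROP)
        e_va = validation_error()    # error on the held-out validation set
        if e_va < best_val:
            best_val, best_epoch = e_va, epoch
        # Generalization loss in percent (guard against a zero best error).
        gl = 100.0 * (e_va / max(best_val, 1e-12) - 1.0)
        if gl > alpha:
            break                    # overfitting detected: stop training
    return best_epoch, best_val
```

Tighter thresholds stop sooner and save training time; more patient ones correspond to the slower criteria whose modest generalization gains and roughly fourfold training cost the abstract reports.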
Similar Papers
Cluster Analysis of Neural Network Weights for Discrimination of EEG Signals
Category: Applications. No part of this work has been submitted to or appeared in other scientific conferences. Abstract: Neural networks are trained to classify half-second segments of six-channel EEG data into one of five classes corresponding to five cognitive tasks performed by one subject. Two- and three-layer feed-forward neural networks are trained using 10-fold cross-validation and early stopping...
Asymptotic statistical theory of overtraining and cross-validation
A statistical theory for overtraining is proposed. The analysis treats general realizable stochastic neural networks, trained with Kullback-Leibler divergence in the asymptotic case of a large number of training examples. It is shown that the asymptotic gain in the generalization error is small if we perform early stopping, even if we have access to the optimal stopping time. Based on the cross...
Classification of EEG Signals from Four Subjects During Five Mental Tasks
Neural networks are trained to classify half-second segments of six-channel EEG data into one of five classes corresponding to five cognitive tasks performed by four subjects. Two- and three-layer feedforward neural networks are trained using 10-fold cross-validation and early stopping to control over-fitting. EEG signals were represented as autoregressive (AR) models. The average percentage of...
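As a hedged illustration of the AR-model representation mentioned above, the sketch below fits autoregressive coefficients to a single-channel segment by ordinary least squares; the model order of 6, the function name, and the use of plain least squares (rather than whatever estimator the authors used) are assumptions.

```python
# Hypothetical sketch: represent a signal segment by autoregressive (AR)
# coefficients fitted with ordinary least squares (numpy only).
import numpy as np

def ar_coefficients(segment, order=6):
    """Fit x[t] = a1*x[t-1] + ... + a_p*x[t-p] and return (a1, ..., a_p)."""
    x = np.asarray(segment, dtype=float)
    # Lagged design matrix: column k holds the signal delayed by k samples.
    X = np.column_stack([x[order - k : len(x) - k] for k in range(1, order + 1)])
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs

# A feature vector for a multi-channel segment could then be the concatenation
# of per-channel coefficients, e.g.:
# features = np.concatenate([ar_coefficients(ch) for ch in segment_channels])
```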
Development of Soft Sensor to Estimate Multiphase Flow Rates Using Neural Networks and Early Stopping
This paper proposes a soft sensor to estimate phase flow rates using common measurements in oil and gas production wells. The developed system addresses the limited production monitoring that results from relying on conventional metering facilities. It offers a cost-effective solution to meet real-time monitoring demands, reduces operational and maintenance costs, and acts as a back-up to multiphase flow meters...